In statistics, Hodges’ estimator (or the Hodges–Le Cam estimator), named for Joseph Hodges, is a famous counterexample of an estimator which is "superefficient", i.e. one that attains a smaller asymptotic variance than regular efficient estimators. The existence of such a counterexample is the reason for the introduction of the notion of regular estimators. Hodges’ estimator improves upon a regular estimator at a single point. In general, any superefficient estimator may surpass a regular estimator at most on a set of Lebesgue measure zero.

== Construction ==
Suppose \hat\theta_n is a "common" estimator for some parameter ''θ'': it is consistent, and converges to some asymptotic distribution ''L_θ'' (usually this is a normal distribution with mean zero and variance which may depend on ''θ'') at the \sqrt{n} rate:

: \sqrt{n}\,(\hat\theta_n - \theta)\ \xrightarrow{d}\ L_\theta\ .

Then Hodges’ estimator \hat\theta_n^H is defined as

: \hat\theta_n^H = \begin{cases} \hat\theta_n, & \text{if } |\hat\theta_n| \geq n^{-1/4}, \\ 0, & \text{if } |\hat\theta_n| < n^{-1/4}. \end{cases}

This estimator is equal to \hat\theta_n everywhere except on the small interval (-n^{-1/4},\, n^{-1/4}), where it is equal to zero. It is not difficult to see that this estimator is consistent for ''θ'', and its asymptotic distribution is

: \begin{align} & n^\alpha (\hat\theta_n^H - \theta)\ \xrightarrow{d}\ 0, \qquad \text{when } \theta = 0, \\ & \sqrt{n}\,(\hat\theta_n^H - \theta)\ \xrightarrow{d}\ L_\theta, \quad \text{when } \theta \neq 0, \end{align}

for any ''α'' ∈ R. Thus this estimator has the same asymptotic distribution as \hat\theta_n for all ''θ'' ≠ 0, whereas for ''θ'' = 0 the rate of convergence becomes arbitrarily fast. This estimator is ''superefficient'', as it surpasses the asymptotic behavior of the efficient estimator at least at the one point ''θ'' = 0. In general, superefficiency may only be attained on a subset of measure zero of the parameter space Θ.
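
The superefficiency at ''θ'' = 0 is easy to check by simulation. The sketch below is illustrative only: it assumes the common estimator \hat\theta_n is the sample mean of i.i.d. N(''θ'', 1) observations and uses the n^{-1/4} threshold from the definition above; the helper hodges_estimator is a hypothetical name, not from any library.

 import numpy as np
 
 def hodges_estimator(x):
     """Hodges' estimator built from the sample mean (illustrative sketch).
 
     Returns the ordinary sample mean, truncated to zero whenever it
     falls inside the shrinking window (-n^{-1/4}, n^{-1/4}).
     """
     n = len(x)
     theta_hat = x.mean()  # the "common" estimator
     return theta_hat if abs(theta_hat) >= n ** (-0.25) else 0.0
 
 rng = np.random.default_rng(0)
 n, reps = 10_000, 2_000
 for theta in (0.0, 1.0):
     errors = np.array([
         np.sqrt(n) * (hodges_estimator(rng.normal(theta, 1.0, n)) - theta)
         for _ in range(reps)
     ])
     # Variance of the rescaled error across replications
     print(f"theta={theta}: var of sqrt(n)*(est - theta) = {errors.var():.4f}")

Under these assumptions the printed variance is essentially zero at ''θ'' = 0 (the sample mean falls inside the truncation window in nearly every replication, so the estimator returns exactly zero) and close to 1 at ''θ'' = 1, matching the N(0, 1) limit of the untruncated mean.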